Therefore, some updates have been used to correct the original Whittaker-Henderson algorithm. In these updates, the fidelity term has been revised using the following method, which employs a decaying function [Mazet et al., 2005; Cobas et al., 2006; Zhang et al., 2010].

$$
e = \begin{cases}
0 & \text{if } s > b \\
f(s, b) & \text{otherwise}
\end{cases}
\qquad (5.3)
$$

In this scenario, only the negative errors are treated as noise, while the positive errors are ignored when estimating the baseline for a spectrum using this type of approach.
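To make the asymmetric treatment of the errors concrete, the sketch below (a minimal Python illustration, not taken from the cited methods) evaluates the per-point error of Equation (5.3). The cost function $f(s, b)$ is not specified in this excerpt, so a squared residual is used as an assumed placeholder.

```python
import numpy as np

def truncated_error(s, b, f=lambda s, b: (s - b) ** 2):
    """Per-point error of Eq. (5.3): zero wherever the spectrum lies above
    the current baseline estimate (s > b), f(s, b) otherwise.

    The default f is a squared residual, used here purely for illustration;
    the concrete choice of f depends on the baseline-correction variant.
    """
    s = np.asarray(s, dtype=float)
    b = np.asarray(b, dtype=float)
    return np.where(s > b, 0.0, f(s, b))
```

With this weighting, points above the baseline (the candidate peaks) contribute nothing to the fidelity term, so the fitted baseline is pulled only towards the points at or below it.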

After a baseline has been estimated and removed from a spectrum, artifacts may still be present. The artifacts retained in a baseline-removed spectrum are potential noise. They must be carefully dealt with in order to prevent them from being mistakenly treated as signals. A smoothing process is commonly required to combat these artifacts, for the purpose of discovering the signals as accurately as possible from a baseline-removed peak spectrum.
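As one possible form of such a smoothing step (the specific smoother is not detailed in this excerpt), a baseline-removed spectrum could be filtered with a Savitzky-Golay smoother; the window length and polynomial order below are assumed, illustrative values.

```python
import numpy as np
from scipy.signal import savgol_filter

def smooth_baseline_removed(residual, window_length=11, polyorder=3):
    """Smooth a baseline-removed spectrum to suppress small artifacts
    before signal detection.

    window_length and polyorder are illustrative; they should be tuned to
    the expected peak width, since an overly wide window will also flatten
    genuine signals.
    """
    return savgol_filter(np.asarray(residual, dtype=float),
                         window_length=window_length,
                         polyorder=polyorder)
```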

When detecting signals from a noise-contaminated spectrum by estimating a baseline, a problem arises. If a model is too flexible, some true signals may be treated as noise; otherwise, some false signals may appear. It is therefore a problem of how to trade off between the true signal discovery rate and the false signal discovery rate. The optimal strategy is, of course, to maximise the true signal discovery rate and, at the same time, to minimise the false signal discovery rate.

The WH model is a version of a regularised regression model. In a WH model, the regularisation constant plays an important role in trading off between the true signal discovery rate and the false signal discovery rate. As shown in Equation (5.1), when the regularisation constant $\lambda$ is too small, an estimated baseline will not be sufficiently smooth. For instance, if $\lambda = 0$, only the fidelity term remains in the objective. In this case, no smoothness is considered, and an estimated baseline will tend to fit every detail, including signal intensities. The consequence is $\mathbf{s} - \mathbf{b} \rightarrow \mathbf{0}$, i.e., almost no signals at all. If $\lambda = \infty$, the roughness penalty dominates the objective. In this case, an estimated baseline will be a straight line. The consequence is that many true signals may be missed while many false signals may appear. Therefore, this regularisation constant must be very carefully chosen.
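To illustrate the role of $\lambda$, the following minimal sketch implements the symmetric Whittaker-Henderson smoother in the form suggested by Equation (5.1), i.e. a fidelity term plus $\lambda$ times a squared-difference roughness penalty. The function name, the use of second-order differences, and the example $\lambda$ values are assumptions for illustration; the asymmetric variants cited above would additionally reweight the fidelity term as in Equation (5.3).

```python
import numpy as np

def whittaker_smooth(s, lam, order=2):
    """Minimise ||s - b||^2 + lam * ||D b||^2, where D is the order-th
    difference operator. A small lam lets b follow every detail of s
    (so s - b -> 0); a very large lam with order=2 forces b towards a
    straight line."""
    s = np.asarray(s, dtype=float)
    n = s.size
    D = np.diff(np.eye(n), n=order, axis=0)   # order-th difference operator
    A = np.eye(n) + lam * (D.T @ D)           # normal-equations matrix
    return np.linalg.solve(A, s)              # baseline estimate b

# Illustrative effect of the regularisation constant on a toy spectrum:
x = np.linspace(0.0, 1.0, 200)
spectrum = 0.5 * x + np.exp(-((x - 0.5) ** 2) / 0.001)  # drift + one peak
b_rough = whittaker_smooth(spectrum, lam=1e-2)  # tracks the spectrum closely
b_flat = whittaker_smooth(spectrum, lam=1e8)    # approaches a straight line
```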